
NSGA-PINN: A Multi-Objective Optimization Method for Physics-Informed Neural Network Training

Lu, Binghang, Moya, Christian B., Lin, Guang

arXiv.org Artificial Intelligence

This paper presents NSGA-PINN, a multi-objective optimization framework for effective training of Physics-Informed Neural Networks (PINNs). The proposed framework uses the Non-dominated Sorting Genetic Algorithm (NSGA-II) to help traditional stochastic gradient optimizers (e.g., ADAM) escape local minima. In addition, NSGA-II enables the initial and boundary conditions encoded into the loss function to be satisfied precisely during physics-informed training. We demonstrate the effectiveness of the framework by applying NSGA-PINN to several ordinary and partial differential equation problems. In particular, we show that it can handle challenging inverse problems with noisy data.
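The abstract does not include code, but the core operation in NSGA-II is non-dominated sorting of candidate solutions by their objective vectors. Here is a minimal sketch of extracting the first Pareto front; the loss vectors are hypothetical placeholders (think of each tuple as, e.g., a PDE-residual loss paired with a boundary-condition loss), not values from the paper.

```python
def dominates(a, b):
    """True if loss vector a Pareto-dominates b: no worse in every
    objective and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_front(points):
    """Return the first (non-dominated) front of a list of loss vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (residual_loss, boundary_loss) pairs for four candidate networks
losses = [(0.2, 0.9), (0.5, 0.5), (0.9, 0.1), (0.6, 0.6)]
print(non_dominated_front(losses))  # → [(0.2, 0.9), (0.5, 0.5), (0.9, 0.1)]
```

The full NSGA-II algorithm repeats this sorting to rank the whole population into successive fronts and adds a crowding-distance tiebreaker, which this sketch omits.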


Evolution of a salesman: A complete genetic algorithm tutorial for Python

#artificialintelligence

In this tutorial, we'll be using a GA to find a solution to the traveling salesman problem (TSP). Let's start with a few definitions, rephrased in the context of the TSP: Now, let's see this in action. While each part of our GA is built from scratch, we'll use a few standard packages to make things easier: We first create a City class that will allow us to create and handle our cities. These are simply our (x, y) coordinates. Within the City class, we add a distance calculation (making use of the Pythagorean theorem) in line 6 and a cleaner way to output the cities as coordinates with __repr__ in line 12. We'll also create a Fitness class.
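The tutorial's code is not reproduced in this excerpt, but based on its description — a `City` class holding (x, y) coordinates, a Pythagorean distance method, and a `__repr__` for clean printing — a minimal reconstruction might look like this (details such as the exact formatting are assumptions, not the tutorial's verbatim code):

```python
import math

class City:
    """A city as an (x, y) coordinate pair."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def distance(self, other):
        # Euclidean distance via the Pythagorean theorem
        return math.hypot(self.x - other.x, self.y - other.y)

    def __repr__(self):
        # Cleaner output when printing routes of cities
        return f"({self.x},{self.y})"

a, b = City(0, 0), City(3, 4)
print(a.distance(b))  # → 5.0
```

The `Fitness` class mentioned next would then score a route by the total of these pairwise distances (shorter routes being fitter).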


Introduction to Optimization with Genetic Algorithm

@machinelearnbot

Bio: Ahmed Gad received his B.Sc. degree with honors (excellent) in information technology from the Faculty of Computers and Information (FCI), Menoufia University, Egypt, in July 2015. Having ranked first in his faculty, he was recommended for a teaching-assistant position at one of the Egyptian institutes in 2015, and in 2016 he became a teaching assistant and researcher in his faculty. His current research interests include deep learning, machine learning, artificial intelligence, digital signal processing, and computer vision.


Introduction to Optimization with Genetic Algorithm

#artificialintelligence

Selecting optimal parameters for machine learning tasks is challenging. Some results may be bad not because the data is noisy or the learning algorithm is weak, but because of a poor choice of parameter values. This article gives a brief introduction to evolutionary algorithms (EAs) and describes the genetic algorithm (GA), one of the simplest random-based EAs. Suppose a data scientist has an image dataset divided into a number of classes and wants to build an image classifier. After investigating the dataset, the K-nearest neighbors (KNN) algorithm seems to be a good option.
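The article's example of tuning a parameter (such as K in KNN) by GA can be illustrated with a minimal sketch of the GA loop: selection of a mating pool, crossover, and mutation. The fitness function below is a toy stand-in (a peak at 7, imitating, say, validation accuracy as a function of K), not the article's actual experiment.

```python
import random

random.seed(0)  # deterministic run for reproducibility

def fitness(k):
    # Toy objective with a peak at k = 7, standing in for a real
    # validation score of a classifier with parameter k.
    return -(k - 7) ** 2

def evolve(pop_size=20, generations=30, lo=1, hi=50):
    # Random initial population of candidate parameter values
    pop = [random.randint(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: the fitter half survives as the mating pool
        pop.sort(key=fitness, reverse=True)
        pool = pop[: pop_size // 2]
        # Crossover (average of two parents) plus occasional mutation
        children = []
        while len(children) < pop_size - len(pool):
            a, b = random.sample(pool, 2)
            child = (a + b) // 2
            if random.random() < 0.2:
                child = min(hi, max(lo, child + random.randint(-3, 3)))
            children.append(child)
        pop = pool + children
    return max(pop, key=fitness)

print(evolve())
```

With enough generations the population concentrates near the fittest value; swapping `fitness` for a real cross-validation score turns this sketch into the parameter search the article motivates.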